# Large-scale Distillation
## Lamini GPT 774M
Lamini GPT 774M is a 774M-parameter language model based on the gpt2-large architecture and fine-tuned on 2.58 million instruction-tuning samples. It is suited to natural-language instruction-response tasks; see the usage sketch below.
Tags: Large Language Model · Transformers · English
Publisher: MBZUAI
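A minimal usage sketch with the Hugging Face Transformers library, assuming the checkpoint is available on the Hugging Face Hub as `MBZUAI/LaMini-GPT-774M` and expects an Alpaca-style instruction prompt (both the repo id and the prompt template are assumptions, not confirmed by this page):

```python
# Sketch: generating an instruction-following response with Lamini GPT 774M.
# The repo id and prompt template below are assumptions; adjust them to
# match the actual model card before use.
from transformers import pipeline

MODEL_ID = "MBZUAI/LaMini-GPT-774M"  # assumed Hub repo id

# GPT-2-based checkpoints are served through the text-generation pipeline.
generator = pipeline("text-generation", model=MODEL_ID)

instruction = "Explain the difference between fine-tuning and distillation in two sentences."

# Assumed instruction template, matching the instruction-response format
# the model was fine-tuned on.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    f"### Instruction:\n{instruction}\n\n### Response:"
)

output = generator(prompt, max_new_tokens=128, do_sample=False)
# The pipeline returns the prompt plus the completion; strip the prompt.
print(output[0]["generated_text"][len(prompt):].strip())
```

At 774M parameters the model can run on CPU, though a GPU makes generation noticeably faster.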